The auditory cortex is the part of the temporal lobe that processes auditory information in humans and many other vertebrates. It is a part of the auditory system, performing basic and higher functions in hearing, such as a possible role in language switching (cf. Pickles, James O. (2012). An Introduction to the Physiology of Hearing, 4th ed., Bingley, UK: Emerald Group Publishing, p. 238). It is located bilaterally, roughly at the upper sides of the temporal lobes; in humans it curves down and onto the medial surface, on the superior temporal plane, within the lateral sulcus, and comprises parts of the transverse temporal gyri and the superior temporal gyrus, including the planum polare and planum temporale (roughly Brodmann areas 41 and 42, and partially 22) (cf. Pickles 2012, pp. 215–217).
The auditory cortex takes part in the spectrotemporal analysis (that is, analysis in both time and frequency) of the inputs passed on from the ear. Nearby brain areas then filter and pass on the information to the two streams of speech processing. The auditory cortex's function may help explain why particular brain damage leads to particular outcomes. For example, unilateral destruction in a region of the auditory pathway above the cochlear nucleus results in slight hearing loss, whereas bilateral destruction results in cortical deafness.
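As a loose signal-processing illustration of what spectrotemporal analysis means, the sketch below computes a spectrogram of a synthetic tone, decomposing a sound into time and frequency bins. This is only an analogy to the neural processing described above; the sampling rate, tone frequencies, and window length are arbitrary choices made for the example.

```python
# Illustrative sketch: a spectrogram decomposes a sound into time-frequency
# bins, loosely analogous to the spectrotemporal analysis described above.
# All parameters (sampling rate, tone frequencies, window size) are arbitrary
# choices for this example, not values taken from the text.
import numpy as np
from scipy.signal import spectrogram

fs = 16000                              # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)           # 1 second of signal
# A tone that jumps from 440 Hz to 880 Hz halfway through.
x = np.where(t < 0.5,
             np.sin(2 * np.pi * 440 * t),
             np.sin(2 * np.pi * 880 * t))

f, times, Sxx = spectrogram(x, fs=fs, nperseg=512)
peak_freqs = f[np.argmax(Sxx, axis=0)]  # dominant frequency in each time slice
print(peak_freqs[:5], peak_freqs[-5:])  # roughly 440 Hz early, 880 Hz late
```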
Besides receiving input from the ears via lower parts of the auditory system, the auditory cortex also transmits signals back to these areas and is interconnected with other parts of the brain. Within the core (A1), its structure preserves tonotopy, the orderly representation of frequency, mapping low to high frequencies in correspondence with the apex and base, respectively, of the cochlea.
Data about the auditory cortex have been obtained through studies in rodents, cats, macaques, and other animals. In humans, the structure and function of the auditory cortex have been studied using functional magnetic resonance imaging (fMRI), electroencephalography (EEG), and electrocorticography.
Neurons in the auditory cortex are organized according to the frequency of sound to which they respond best. Neurons at one end of the auditory cortex respond best to low frequencies; neurons at the other respond best to high frequencies. There are multiple auditory areas (much like the multiple areas in the visual cortex), which can be distinguished anatomically and on the basis that they contain a complete "frequency map." This frequency map (known as a tonotopic map) likely reflects the fact that the cochlea is arranged according to sound frequency. The auditory cortex is involved in tasks such as identifying and segregating "auditory objects" and identifying the location of a sound in space. For example, it has been shown that A1 encodes complex and abstract aspects of auditory stimuli without encoding their "raw" aspects like frequency content, the presence of a distinct sound, or its echoes.
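As a rough numerical illustration of the low-to-high frequency ordering that this tonotopic map mirrors, the sketch below evaluates the Greenwood place-frequency function of the cochlea with commonly cited human constants. It is a minimal sketch of the cochlear frequency-place relationship only, not a model of the cortex, and the constants are approximations.

```python
# Minimal sketch of the cochlear place-to-frequency mapping that cortical
# tonotopy mirrors, using the Greenwood function with commonly cited human
# constants (an approximation, not a cortical model).
import numpy as np

A, a, k = 165.4, 2.1, 0.88   # commonly cited human constants

def greenwood(x):
    """Characteristic frequency (Hz) at relative distance x from the apex,
    where x = 0 is the apex (low frequencies) and x = 1 the base (high)."""
    return A * (10 ** (a * x) - k)

for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"x = {x:.2f}  ->  ~{greenwood(x):7.0f} Hz")
# The output rises monotonically from roughly 20 Hz at the apex to roughly
# 20 kHz at the base, the same low-to-high ordering preserved in A1.
```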
Studies in humans have indicated that a peripheral part of this brain region is active when trying to identify musical pitch. Individual cells are consistently excited by sounds at specific frequencies, or harmonics of those frequencies.
The auditory cortex plays an important yet ambiguous role in hearing. When the auditory information passes into the auditory cortex, the specifics of what exactly takes place are unclear. There is a large degree of individual variation in the auditory cortex, as noted by English biologist James Beament, who wrote, "The cortex is so complex that the most we may ever hope for is to understand it in principle, since the evidence we already have suggests that no two cortices work in precisely the same way."
In the hearing process, multiple sounds are transduced simultaneously. The role of the auditory system is to decide which components belong to the same sound. Many have surmised that this linkage is based on the location of sounds. However, sounds are distorted in numerous ways when reflected off different media, which makes this explanation unlikely. The auditory cortex instead forms groupings based on fundamentals; in music, for example, this would include harmony, timing, and pitch.
The primary auditory cortex lies in the superior temporal gyrus of the temporal lobe and extends into the lateral sulcus and the transverse temporal gyri (also called Heschl's gyri). Final sound processing is then performed by the parietal and frontal lobes of the human cerebral cortex. Animal studies indicate that auditory fields of the cerebral cortex receive ascending input from the auditory thalamus and that they are interconnected on the same and on the opposite cerebral hemispheres.
The auditory cortex is composed of fields that differ from each other in both structure and function. The number of fields varies among species, from as few as two to as many as 15 in the rhesus monkey. The number, location, and organization of fields in the human auditory cortex are not known at this time. What is known about the human auditory cortex comes from a base of knowledge gained from studies in other animals, including primates, which is used to interpret EEG tests and functional imaging studies of the brain in humans.
When each instrument of a symphony orchestra or jazz band plays the same note, the quality of each sound is different, but the musician perceives each note as having the same pitch. The neurons of the auditory cortex of the brain are able to respond to pitch. Studies in the marmoset monkey have shown that pitch-selective neurons are located in a cortical region near the anterolateral border of the primary auditory cortex. This location of a pitch-selective area has also been identified in recent functional imaging studies in humans.
The primary auditory cortex is subject to neuromodulation by numerous neurotransmitters, including norepinephrine, which has been shown to decrease cellular excitability in all layers of the temporal cortex. Activation of alpha-1 adrenergic receptors by norepinephrine decreases glutamatergic excitatory postsynaptic potentials.
The primary auditory cortex is tonotopically organized, which means that neighboring cells in the primary auditory cortex respond to neighboring frequencies. Tonotopic mapping is preserved throughout most of the auditory circuit. The primary auditory cortex receives direct input from the medial geniculate nucleus of the thalamus and thus is thought to identify the fundamental elements of music, such as pitch and loudness.
An evoked response study of congenitally deaf kittens used local field potentials to measure cortical plasticity in the auditory cortex. These kittens were stimulated and measured against a control (an unstimulated congenitally deaf cat, or CDC) and against normal hearing cats. The field potentials measured for the artificially stimulated CDCs were eventually much stronger than those of a normal hearing cat. This finding accords with a study by Eckart Altenmüller, in which it was observed that students who received musical instruction had greater cortical activation than those who did not.
The auditory cortex has distinct responses to sounds in the gamma band. When subjects are exposed to three or four cycles of a 40 hertz click, an abnormal spike appears in the EEG data that is not present for other stimuli. The spike in neuronal activity corresponding to this frequency is not restricted to the tonotopic organization of the auditory cortex. It has been theorized that gamma frequencies are resonant frequencies of certain areas of the brain and appear to affect the visual cortex as well. Gamma band activation (25 to 100 Hz) has been shown to be present during the perception of sensory events and the process of recognition. In a 2000 study by Kneif and colleagues, subjects were presented with eight musical notes from well-known tunes, such as Yankee Doodle and Frère Jacques. Randomly, the sixth and seventh notes were omitted, and an electroencephalogram (EEG) and a magnetoencephalogram (MEG) were used to measure the neural responses. Specifically, the presence of gamma waves, induced by the auditory task at hand, was measured from the temples of the subjects. The omitted stimulus response (OSR) was located in a slightly different position: 7 mm more anterior, 13 mm more medial, and 13 mm more superior with respect to the complete sets. The OSR recordings were also characteristically lower in gamma waves compared to the complete musical set. The evoked responses during the sixth and seventh omitted notes are assumed to be imagined, and were characteristically different, especially in the right hemisphere. The right auditory cortex has long been shown to be more sensitive to tonality (high spectral resolution), while the left auditory cortex has been shown to be more sensitive to minute sequential differences (rapid temporal changes) in sound, such as in speech.
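As a signal-processing illustration of how gamma-band (25 to 100 Hz) activity, such as a 40 Hz response, might be isolated from a recorded trace, the sketch below band-pass filters a synthetic "EEG-like" signal. The sampling rate, filter order, and test signal are assumptions made for the example and are not taken from the studies described above.

```python
# Illustrative sketch: extracting gamma-band (25-100 Hz) power from a
# synthetic "EEG-like" trace with a Butterworth band-pass filter.
# Sampling rate, filter order, and the test signal are assumptions made
# for this example, not values from any study cited above.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250                                   # EEG sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(0)
# A 40 Hz component buried in broadband noise, standing in for a gamma response.
eeg = 0.5 * np.sin(2 * np.pi * 40 * t) + rng.normal(0, 1, t.size)

b, a = butter(4, [25, 100], btype="bandpass", fs=fs)
gamma = filtfilt(b, a, eeg)                # zero-phase gamma-band signal
print("gamma-band power:", np.mean(gamma ** 2))
```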
Tonality is represented in more places than just the auditory cortex; one other specific area is the rostromedial prefrontal cortex (RMPFC). One study used fMRI to explore the areas of the brain that are active during tonality processing. The results showed preferential blood-oxygen-level-dependent activation of specific voxels in the RMPFC for specific tonal arrangements. Though these collections of voxels do not represent the same tonal arrangements between subjects or within subjects over multiple trials, it is interesting and informative that the RMPFC, an area not usually associated with audition, seems to code for immediate tonal arrangements in this respect. The RMPFC is a subsection of the medial prefrontal cortex, which projects to many diverse areas including the amygdala, and is thought to aid in the inhibition of negative emotion.
Another study has suggested that people who experience 'chills' while listening to music have a higher volume of fibres connecting their auditory cortex to areas associated with emotional processing.
In a study involving dichotic listening to speech, in which one message is presented to the right ear and another to the left, it was found that the participants chose letters with stop consonants (e.g. 'p', 't', 'k', 'b') far more often when these were presented to the right ear than to the left. However, when presented with phonemic sounds of longer duration, such as vowels, the participants did not favor any particular ear. Due to the contralateral nature of the auditory system, the right ear is connected to Wernicke's area, located within the posterior section of the superior temporal gyrus in the left cerebral hemisphere.
Sounds entering the auditory cortex are treated differently depending on whether or not they register as speech. When people listen to speech, according to the strong speech mode hypothesis they engage perceptual mechanisms unique to speech, whereas according to the weak speech mode hypothesis they engage their knowledge of language as a whole.